Activity Distinction using Android Sensors
Tags: other, android, opensignals mobile, activity distinction
The OpenSignals mobile application allows for the acquisition of biosignals using PLUX sensors. Additionally, the app enables you to acquire data from the internal sensors of the phone on which the application is running, offering a simple and straightforward way to record data from a variety of sensors.
A possible use case for the OpenSignals mobile application would be, for example, to record data from patients or subjects while they are at home. This usage is particularly useful when patients/subjects are not able to come to the research facility due to either personal or governmental restrictions. Furthermore, it offers the possibility to record data while the patient/subject is in his/her natural daily environment.
When patients/subjects record at home over longer periods of time, they are usually asked to keep a daily journal in which they write down all the activities performed during the day. These journals, however, may not be absolutely accurate and thus do not offer enough time resolution to reliably determine the start and end times of activities in relation to the recorded signals.
To overcome these constraints, Android sensor data can be helpful to infer the activities that the patient/subject performed throughout the recording time. Using Android's internal sensors is a practical solution, since it is not necessary to equip the patient/subject with additional sensors and the subject/patient usually carries the device throughout the day.
In this Jupyter Notebook we will show you that the data acquired from Android sensors can be leveraged to visually distinguish between four different activities, namely sitting, lying down, standing, and walking.
In case this is your first time working with Android sensors, we highly recommend reading this notebook, which provides all the general information on Android sensors that you need to know.
1 - Equipment
To showcase what an exemplary recording of a patient's/subject's data could look like, not only data from internal Android sensors was recorded, but also data from two PLUX sensors. The following PLUX devices/sensors were used:
The cardioBAN is a wearable device that consists of a shoulder-chest harness strap with an integrated ECG sensor and a 3-axis accelerometer. The ECG uses three dry electrodes that are integrated during the fabrication of the chest strap. This wearable allows for a comfortable acquisition of cardiac signals and basic motion data in dynamic conditions.
The SpO2 (versatile) is a digital sensor designed for peripheral capillary oxygen saturation level estimation using two LEDs, one in the red region and the other in the infrared region of the spectrum. Its design allows placing the sensor on body regions other than the traditional finger clip or ear lobe placements. The sensor was plugged into the digital port of the cardioBAN. In order to read data from the digital sensor, the cardioBAN has to be updated to the newest firmware. In case you do not know how to achieve this, please consult the OpenSignals manual.
Android data was acquired using a Samsung Galaxy A40. Data from the following sensors was recorded: the accelerometer, the gyroscope, and the magnetometer.
The choice of these three sensors is equivalent to using an Inertial Measurement Unit (IMU). These units are commonly used for movement tracking, and thanks to the Android sensor functionality within the OpenSignals mobile application we have such a powerful tool readily at our disposal.
All sensors were set to a sampling rate of 100 Hz in order to facilitate the synchronisation process.
2 - Sensor placement
Throughout all recordings, the sensors and the phone were placed as follows:
The image below illustrates the placement of the sensors. On the left side, the placement of the PLUX sensors is shown. For illustration purposes the individual components, the ECG electrodes, the accelerometer, and the SpO2 sensor, are highlighted in different colors. The image depicts the SpO2 placement without the headband that was used to occlude the sensor from external light sources. On the right side, the placement of the phone within the front pocket is displayed.
3 - Performed Activities and signal recording
As described at the beginning of this Jupyter Notebook, four different activities were performed. Each activity was recorded for approximately 3 minutes and executed in the following way:
In order to synchronise the signals from the PLUX and Android sensors, an approach similar to the one in the Synchronising Android and PLUX sensors Jupyter Notebook was employed. However, in this case the distinctive event was not generated on the z-axis of both devices but on the y-axis. The distinctive event was generated by simply jumping up and down. To ensure that the y-axes of both devices were properly aligned, at the beginning of each recording the phone was held in front of the chest in such a manner that the y-axes of both devices pointed in the same direction (upwards). Then the subject jumped up and down to generate the distinctive event used for synchronisation. Afterwards the phone was placed in the right front pocket of the subject's trousers and the subject started to perform the activity for 3 minutes as described above.
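The alignment of two recordings around such a distinctive event can be sketched in code. The snippet below is a minimal illustration, not the exact procedure used for this notebook: it estimates the sample offset between two accelerometer y-axis recordings by cross-correlating them, and the function name and synthetic signals are our own.

```python
import numpy as np

def sync_offset(acc_y_plux, acc_y_android):
    """Estimate the lag (in samples) between two accelerometer y-axis
    signals that both contain the same distinctive event (the jump).
    A positive lag means the event appears later in the PLUX signal."""
    a = acc_y_plux - np.mean(acc_y_plux)
    b = acc_y_android - np.mean(acc_y_android)
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

# Synthetic example: the same spike, 50 samples (0.5 s at 100 Hz) apart
plux = np.zeros(1000)
plux[200] = 1.0      # jump event in the PLUX recording
android = np.zeros(1000)
android[150] = 1.0   # same event, appearing earlier in the Android recording
print(sync_offset(plux, android))  # → 50
```

Once the lag is known, one recording can simply be shifted by that many samples before both are compared.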
3.1 - Data pre-processing
Before the signals were plotted for this Jupyter Notebook, some pre-processing steps were performed. These included:
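As a hypothetical illustration of one such step, trimming each recording to the 3-minute activity window that follows the synchronisation event could look like this (the function name and the sample numbers are ours, not the exact pipeline used for this notebook):

```python
import numpy as np

def trim_to_activity(signal, event_index, fs=100, window_s=180):
    """Keep only the fixed-length activity window (3 min by default)
    that starts at the detected synchronisation event."""
    signal = np.asarray(signal)
    return signal[event_index:event_index + window_s * fs]

raw = np.zeros(100 * 200)                 # 200 s of dummy samples at 100 Hz
trimmed = trim_to_activity(raw, event_index=500)
print(len(trimmed) / 100)                 # → 180.0 (seconds kept)
```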
4 - Comparing the signals of the recorded activities
Now that we have all of the important information out of the way, we can start having a look at the signals that each activity generates. In this section we will compare all of these signals and point out the visual differences that let us distinguish between each activity. As we will see, the activities will not necessarily be distinguishable based on the ECG and SpO2 data recorded using the PLUX sensors alone. However, the signals gathered from the internal Android sensors will allow us to make this distinction.
4.1 - Sitting
Below, the signals generated while sitting are shown. Through a manual estimate (counting the peaks), we can see that the heart rate is around 65 bpm. The readings from the red and infrared LEDs of the SpO2 sensor follow roughly the same path, the difference between the two being that they are slightly shifted in their intensity. This indicates that blood oxygenation levels are most likely above 90 %. Looking at the Android sensor data, we can observe that there are almost no changes within the respective axes of each sensor.
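Counting peaks by eye can also be automated. The sketch below is a deliberately simple peak-counting heart-rate estimate; the threshold choice is an assumption of ours, and real ECG pipelines typically band-pass filter the signal first (e.g. Pan-Tompkins).

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(ecg, fs=100):
    """Rough heart rate (bpm) from counting R-peak-like maxima."""
    ecg = np.asarray(ecg, dtype=float)
    # simple amplitude threshold between the mean and the maximum
    height = ecg.mean() + 0.6 * (ecg.max() - ecg.mean())
    # refractory period: no two beats closer than 0.4 s
    peaks, _ = find_peaks(ecg, height=height, distance=int(0.4 * fs))
    return len(peaks) / (len(ecg) / fs / 60)

# Synthetic trace: one spike every 92 samples (~65 bpm at 100 Hz)
fs = 100
ecg = np.zeros(60 * fs)
ecg[::92] = 1.0
print(round(estimate_heart_rate(ecg, fs)))  # → 65
```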
The accelerometer shows the expected values. The acceleration along the x-axis is above 8 m/s$^2$. This makes sense because the phone's x-axis points upwards while sitting. The fact that it does not measure exactly 9.81 m/s$^2$ is probably due to the phone being slightly tilted forward or backward in the pocket, or the subject's thighs not being perpendicular to the chair. This tilt can also be observed in the acceleration along the y- and z-axis of the phone. Here we can see that the acceleration along the y-axis is slightly below 0, at roughly -1.6 m/s$^2$, and along the z-axis it is around -6.4 m/s$^2$. With all this information in mind, we can assume that the phone is slightly tilted forward along the x-y-plane (assuming mathematical rotation) and also slightly tilted towards the leg with respect to the z-axis. It is plausible to assume that the phone is tilted along the z-axis because, while sitting, the fabric of the trousers is more stressed and thus the phone is pressed towards the leg.
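The tilt reasoning above can be made quantitative: for a static reading, the angle between each phone axis and the measured gravity vector follows from the direction cosines. A small sketch using the approximate values quoted above (the function name is ours):

```python
import numpy as np

def axis_tilt_deg(ax, ay, az):
    """Angle (degrees) between each phone axis and the measured
    gravity vector, for a static accelerometer reading in m/s^2."""
    g = np.sqrt(ax**2 + ay**2 + az**2)   # measured gravity magnitude
    return tuple(float(np.degrees(np.arccos(c / g))) for c in (ax, ay, az))

# Approximate sitting values from the text
tilt_x, tilt_y, tilt_z = axis_tilt_deg(8.0, -1.6, -6.4)
print(f"x-axis is tilted {tilt_x:.0f} degrees away from the gravity vector")
```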
The gyroscope data shows that the subject was sitting quite calmly during the recording. All three channels show only minor deviations from the origin.
Finally, the magnetometer indicates the same as the gyroscope: the subject did not perform major movements while seated. Since the magnetometer measures the geomagnetic field strength, the readings do not change much when the subject doesn't move.
4.2 - Lying down
Looking at the signals from when the subject is lying down, they do not look too different from sitting at first glance. The heart rate is around 61 bpm and the SpO2 sensor returns values in a similar range. The Android signals do not look very different either. However, on closer inspection we can make out some slight differences.
Taking a look at the accelerometer signals reveals that the acceleration along the y-axis is much closer to zero. Also, the acceleration along the z-axis is shifted a bit upwards (from roughly -6.4 m/s$^2$ to approximately -5.2 m/s$^2$). This makes sense because, when the subject is lying down, the phone is much more aligned with the leg and, as a consequence, with the surface on which the subject is lying. However, one has to keep in mind that when the surface is tilted, so will be the sensor readings from the phone.
The gyroscope signals do not differ much from the ones recorded while the subject was sitting. This is plausible, because the subject did not move much while performing either of these activities.
The magnetometer data also doesn't show too much variability for the geomagnetic field along each axis. Yet, there is one difference that we can make out between both of these recordings. The geomagnetic field reading along the y-axis changed from around 11 $\mu$T to -22 $\mu$T, and along the z-axis a change from roughly 2.6 $\mu$T to 9 $\mu$T can be observed. This difference, however, unfortunately does not help us with distinguishing the two activities. Rather, the difference gives us a hint as to which direction the subject is facing. When the subject was performing the sitting activity, the phone's y-axis was approximately facing west, while during lying down it was facing roughly south. Please keep in mind that the readings for the geomagnetic field may differ based on your location on the earth and can also be influenced by objects that generate their own electromagnetic field.
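The facing-direction hint can be turned into a rough compass heading. The sketch below assumes the device is held approximately level, which is only loosely true for a phone in a pocket, so treat it as a coarse orientation hint; the sign convention here is our own simplification of the full Android rotation-matrix approach.

```python
import numpy as np

def heading_deg(mx, my):
    """Rough heading of the phone's y-axis from the horizontal
    magnetometer components (uT): 0 = magnetic north, 90 = east.
    Valid only if the device is held approximately level."""
    return float(np.degrees(np.arctan2(-mx, my)) % 360)

# Field entirely along +y: the y-axis points towards magnetic north
print(round(heading_deg(0.0, 30.0)))   # → 0
# Field entirely along -x: the y-axis points east
print(round(heading_deg(-30.0, 0.0)))  # → 90
```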
4.3 - Standing
The signals generated while standing are shown below. Here we can make a clearer distinction from the previous two activities. The heart rate slightly increased to roughly 79 bpm. A slight increase in heart rate is usually observed when changing from a lying or sitting position to a standing position. However, it is important to note that a slight increase in heart rate does not necessarily mean that a person is standing. Thus, this cannot be reliably used as an indicator for distinguishing this activity. For the SpO2 sensor we still observe the same range of values as before. The accelerometer data, however, shows a clear distinction between the accelerations along its respective axes. Mainly, it can be observed that now the acceleration along the y-axis has approximately the same value as the earth's gravity (g $\approx$ 9.81 m/s$^2$), just in the negative direction. This of course makes sense because, when standing, the y-axis of the phone points towards the floor. The acceleration along the x- and z-axis is close to 1 or -1 because both of these axes are nearly parallel to the floor, with some minor shifts due to the position of the phone within the front pocket of the subject's trousers.
We can also see a higher variability within the gyroscope signals. This is due to the fact that when someone is standing in an upright position (assuming they are not leaning against a wall or are supported by something), the person usually tends to teeter slightly (i.e. moving the upper body back and forth and to the sides).
The same as for the last two activities can be said about the magnetometer for this activity. Since the subject is not moving all too much, the readings cannot be directly used to distinguish the activity. For activities in which the subject does not move much, the information that can be retrieved is more about the orientation in relation to the earth's magnetic field.
4.4 - Walking
Finally, walking clearly shows a stark difference to the other three activities. Differences can be seen directly in most of the signals. The ECG clearly shows distortions; these are movement artifacts that result from the dry electrodes slightly moving during walking. The heart rate is at approximately 83 bpm. The SpO2 sensor signals are still in the same range as before. Data from the Android sensors shows a clearly observable periodicity. It is reasonable to assume that this periodicity is directly linked to the rate at which steps are taken during walking. Thus, it can not only be observed that the subject is walking, but it should also be possible to estimate how many steps the subject took during the recording. From this, an estimate of the distance and velocity that the subject walked could also be made (assuming a fixed step length).
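The step-counting idea can be sketched as peak detection on the accelerometer magnitude. The thresholds and the 0.7 m step length below are assumptions of ours for illustration, not calibrated values.

```python
import numpy as np
from scipy.signal import find_peaks

def count_steps(acc_mag, fs=100, step_length_m=0.7):
    """Count steps as peaks in the accelerometer magnitude and derive a
    rough distance (m) and velocity (m/s). The 0.7 m step length is an
    assumed constant; in reality it varies per person and per speed."""
    acc_mag = np.asarray(acc_mag, dtype=float)
    peaks, _ = find_peaks(acc_mag - acc_mag.mean(),
                          height=0.5 * acc_mag.std(),
                          distance=int(0.3 * fs))  # >= 0.3 s between steps
    steps = len(peaks)
    distance = steps * step_length_m
    return steps, distance, distance / (len(acc_mag) / fs)

# Synthetic walking signal: ~2 steps per second for 60 s at 100 Hz
fs = 100
t = np.arange(0, 60, 1 / fs)
acc = 9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * t)
steps, dist, vel = count_steps(acc, fs)
print(steps)  # → 120
```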
5 - Activity distinction: a perfect problem for machine learning
Visually examining the data and then trying to determine when a certain activity starts or ends is of course a task that is far too time-consuming and exhausting. However, machine learning would be a perfect tool to solve this task. Assuming one collects enough data, the sensor data could be used to build a classifier. This classifier should then be able to offer accurate and reliable activity tracking using the data that the patient records throughout the recording process. Activity distinction or, to name it in machine learning terms, activity recognition is a problem that is still actively researched in the scientific community. Several papers try to tackle this problem. We recommend reading the following papers to get a good overview of the topic and to learn about different approaches.
Of course, when tackling this problem one does not have to restrict oneself to only using the accelerometer, gyroscope, and magnetometer. The Android system offers a great variety of other sensors. The light and proximity sensors could, for example, be used to detect when the patient/subject pulls the phone out of the pocket. Integration of GPS data could be used to distinguish between sitting in front of a desk, in a car, in a train, etc. The OpenSignals mobile application also allows you to include biosignals, such as EMG, ECG, fNIRS, and many more, into your model. The possibilities are basically endless!
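To make the classifier idea concrete, here is a minimal sketch on synthetic stand-in data. The window length, the feature set, and the random-forest choice are all our assumptions, not a validated pipeline, and the toy signals merely mimic the standing vs. walking patterns described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def window_features(signal, fs=100, win_s=2):
    """Cut a 1-D signal into 2 s windows and compute simple per-window
    features: mean, standard deviation, and peak-to-peak amplitude."""
    win = fs * win_s
    n = len(signal) // win
    chunks = np.asarray(signal[:n * win]).reshape(n, win)
    return np.column_stack([chunks.mean(axis=1),
                            chunks.std(axis=1),
                            np.ptp(chunks, axis=1)])

# Synthetic stand-ins: 'standing' is near-constant, 'walking' oscillates
rng = np.random.default_rng(0)
fs = 100
standing = 9.81 + 0.05 * rng.standard_normal(60 * fs)
walking = (9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * np.arange(60 * fs) / fs)
           + 0.2 * rng.standard_normal(60 * fs))

X = np.vstack([window_features(standing, fs), window_features(walking, fs)])
y = np.array([0] * 30 + [1] * 30)   # 0 = standing, 1 = walking
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, clf.predict(X_te)))  # the toy classes separate easily
```

With real recordings, honest evaluation would require holding out whole sessions or subjects rather than randomly shuffled windows, since neighbouring windows are highly correlated.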
In this Jupyter notebook we showed that Android sensors can be leveraged to distinguish between different activities such as sitting, lying down, standing, and walking.
We hope that you have enjoyed this guide. biosignalsnotebooks is an environment in continuous expansion, so don't stop your journey and learn more with the remaining Notebooks.